A genetic approach for training diverse classifier ensembles
Abstract
Classification is an active topic in Machine Learning. The most recent achievements in this domain suggest using ensembles of learners instead of a single classifier to improve classification accuracy. Comparisons between Bagging and Boosting show that classifier ensembles perform better when their members exhibit diversity, that is, when they commit different errors. This paper proposes a genetic algorithm for designing classifier ensembles, using a fitness function based on both accuracy and diversity. The proposed implementation has been run on several UCI Machine Learning datasets and compared with the performance obtained by the bagging algorithm and by a single classifier of the same type.
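The abstract does not detail the chromosome encoding or the exact fitness function, so the following Python sketch only illustrates one plausible setup under stated assumptions: a pool of pre-trained base classifiers whose validation-set predictions are precomputed, a bit-string chromosome that selects ensemble members, and a fitness that blends majority-vote accuracy with average pairwise disagreement through a hypothetical alpha weight.

import numpy as np

rng = np.random.default_rng(0)

def majority_vote(preds):
    # preds: (n_members, n_samples) integer class labels
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)

def disagreement(preds):
    # mean pairwise fraction of samples on which two members differ
    n = len(preds)
    if n < 2:
        return 0.0
    pairs = [(preds[i] != preds[j]).mean() for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(pairs))

def fitness(mask, pool_preds, y, alpha=0.3):
    # alpha is a hypothetical accuracy/diversity trade-off, not the paper's value
    if mask.sum() == 0:
        return 0.0
    member_preds = pool_preds[mask.astype(bool)]
    acc = (majority_vote(member_preds) == y).mean()
    return (1 - alpha) * acc + alpha * disagreement(member_preds)

def evolve(pool_preds, y, pop_size=30, generations=50, p_mut=0.05):
    n_classifiers = pool_preds.shape[0]
    pop = rng.integers(0, 2, size=(pop_size, n_classifiers))
    for _ in range(generations):
        scores = np.array([fitness(ind, pool_preds, y) for ind in pop])
        # tournament selection of parents
        parents = pop[[max(rng.choice(pop_size, 3), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        # one-point crossover between consecutive parents
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_classifiers)
            children[i, cut:], children[i + 1, cut:] = (parents[i + 1, cut:].copy(),
                                                        parents[i, cut:].copy())
        # bit-flip mutation
        flips = rng.random(children.shape) < p_mut
        children[flips] ^= 1
        pop = children
    scores = np.array([fitness(ind, pool_preds, y) for ind in pop])
    return pop[scores.argmax()]

A call such as evolve(pool_preds, y_val), where pool_preds holds each pool member's predictions on a held-out validation set, would return the member mask with the best accuracy/diversity trade-off found by the search.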
Similar papers
Constructing Diverse Classifier Ensembles using Artificial Training Examples
Ensemble methods like bagging and boosting that combine the decisions of multiple hypotheses are some of the strongest existing machine learning methods. The diversity of the members of an ensemble is known to be an important factor in determining its generalization error. This paper presents a new method for generating ensembles that directly constructs diverse hypotheses using additional arti...
Creating diversity in ensembles using artificial data
The diversity of an ensemble of classifiers is known to be an important factor in determining its generalization error. We present a new method for generating ensembles, Decorate (Diverse Ensemble Creation by Oppositional Relabeling of Artificial Training Examples), that directly constructs diverse hypotheses using additional artificially-constructed training examples. The technique is a simple...
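As a rough illustration of the idea summarised above (not the authors' implementation), the sketch below grows an ensemble by repeatedly training a candidate on the real data plus artificial examples labeled against the current ensemble's predictions; the Gaussian attribute sampling, the random oppositional labels, and the simple accept/reject rule are simplifying assumptions.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def ensemble_predict(members, X):
    # majority vote; assumes non-negative integer class labels
    votes = np.array([m.predict(X) for m in members])
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)

def decorate_like(X, y, n_members=10, n_artificial=50, max_tries=30, seed=0):
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    members = [DecisionTreeClassifier(random_state=0).fit(X, y)]
    current_err = (ensemble_predict(members, X) != y).mean()
    tries = 0
    while len(members) < n_members and tries < max_tries:
        tries += 1
        # sample artificial points from a per-feature Gaussian fitted to the data
        X_art = rng.normal(X.mean(axis=0), X.std(axis=0) + 1e-9,
                           size=(n_artificial, X.shape[1]))
        # "oppositional" labels: any class other than the current ensemble's prediction
        pred = ensemble_predict(members, X_art)
        y_art = np.array([rng.choice(classes[classes != p]) for p in pred])
        candidate = DecisionTreeClassifier(random_state=tries).fit(
            np.vstack([X, X_art]), np.concatenate([y, y_art]))
        trial = members + [candidate]
        err = (ensemble_predict(trial, X) != y).mean()
        if err <= current_err:   # keep the member only if training error does not rise
            members, current_err = trial, err
    return members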
Bagging and Boosting for the Nearest Mean Classifier: Effects of Sample Size on Diversity and Accuracy
In combining classifiers, it is believed that diverse ensembles perform better than non-diverse ones. In order to test this hypothesis, we study the accuracy and diversity of ensembles obtained in bagging and boosting applied to the nearest mean classifier. In our simulation study we consider two diversity measures: the Q statistic and the disagreement measure. The experiments, carried out on f...
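For reference, the two pairwise measures mentioned in the snippet are commonly computed from the correctness indicators of two classifiers on a common test set; the helper below is a plain sketch of those standard definitions, not code from the paper.

import numpy as np

def pairwise_diversity(pred_a, pred_b, y):
    a_ok = pred_a == y            # classifier A correct
    b_ok = pred_b == y            # classifier B correct
    n11 = np.sum(a_ok & b_ok)     # both correct
    n00 = np.sum(~a_ok & ~b_ok)   # both wrong
    n10 = np.sum(a_ok & ~b_ok)    # only A correct
    n01 = np.sum(~a_ok & b_ok)    # only B correct
    # Q statistic in [-1, 1]; small epsilon guards against a zero denominator
    q = (n11 * n00 - n01 * n10) / (n11 * n00 + n01 * n10 + 1e-12)
    # disagreement measure: fraction of samples on which the two classifiers differ in correctness
    disagreement = (n01 + n10) / len(y)
    return q, disagreement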
An Approach for Assimilation of Classifier Ensembles on the Basis of Feature Selection and Diversity by Majority Voting and Bagging
A Classifier Ensemble (CE) efficiently improves the generalization ability of a classifier compared to that of a single classifier. This paper proposes an alternative approach for the integration of classifier ensembles. Initially, three classifiers that are highly diverse and show good classification accuracy when applied to six UCI (University of California, Irvine) datasets are selected. Then Feature S...
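The combination step described here (several diverse base learners fused by majority voting, with bagging as a further option) can be expressed directly with scikit-learn; the particular base learners in the sketch below are illustrative assumptions rather than the paper's selection.

from sklearn.ensemble import VotingClassifier, BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

# three base learners of different types combined by hard majority voting
ensemble = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier()),
        ("knn", KNeighborsClassifier()),
        ("nb", GaussianNB()),
    ],
    voting="hard",   # majority vote over predicted labels
)

# bagging can additionally be layered on any of the members, e.g. bagged trees
bagged_trees = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25)

Both objects follow the usual scikit-learn API, i.e. they are trained with fit(X_train, y_train) and queried with predict(X_test).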
Random Convolution Ensembles
A novel method for creating diverse ensembles of image classifiers is proposed. The idea is that, for each base image classifier in the ensemble, a random image transformation is generated and applied to all of the images in the labeled training set. The base classifiers are then learned using features extracted from these randomly transformed versions of the training data, and the result is a ...
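Below is a minimal sketch of this scheme, under the assumption that each member draws its own random convolution kernel, filters the training images with it, and learns on the flattened filtered pixels; the kernel size and the decision-tree base learner are illustrative choices, not the paper's.

import numpy as np
from scipy.ndimage import convolve
from sklearn.tree import DecisionTreeClassifier

def train_random_convolution_ensemble(images, labels, n_members=15, kernel_size=3, seed=0):
    # images: iterable of 2-D arrays; labels: non-negative integer classes
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(n_members):
        kernel = rng.normal(size=(kernel_size, kernel_size))   # one random filter per member
        transformed = np.array([convolve(img, kernel) for img in images])
        features = transformed.reshape(len(transformed), -1)   # flattened filtered pixels
        clf = DecisionTreeClassifier().fit(features, labels)
        members.append((kernel, clf))
    return members

def predict(members, images):
    votes = []
    for kernel, clf in members:
        feats = np.array([convolve(img, kernel) for img in images]).reshape(len(images), -1)
        votes.append(clf.predict(feats))
    # combine the members' predictions by majority vote
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, np.array(votes))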